Efficient Locally Weighted Polynomial Regression Predictions
Abstract
Locally weighted polynomial regression (LWPR) is a popular instance-based algorithm for learning continuous non-linear mappings. For more than two or three inputs and for more than a few thousand datapoints, the computational expense of predictions is daunting. We discuss drawbacks of previous approaches to dealing with this problem, and present a new algorithm based on a multiresolution search of a quickly-constructible augmented kd-tree. Without needing to rebuild the tree, we can make fast predictions with arbitrary local weighting functions, arbitrary kernel widths and arbitrary queries. The paper begins with a new, faster algorithm for exact LWPR predictions. Next we introduce an approximation that achieves up to a two-orders-of-magnitude speedup with negligible accuracy losses. Increasing a certain approximation parameter achieves greater speedups still, but with a correspondingly larger accuracy degradation. This is nevertheless useful during operations such as the early stages of model selection and locating optima of a fitted surface. We also show how the approximations can permit real-time query-specific optimization of the kernel width. We conclude with a brief discussion of potential extensions for tractable instance-based learning on datasets that are too large to fit in a computer's main memory.

1 Locally Weighted Polynomial Regression

Locally weighted polynomial regression (LWPR) is a form of instance-based (a.k.a. memory-based) algorithm for learning continuous non-linear mappings from real-valued input vectors to real-valued output vectors. It is particularly appropriate for learning complex, highly non-linear functions of up to about 30 inputs from noisy data. Popularized in the statistics literature over the past decades (Cleveland and Devlin, 1988; Grosse, 1989; Atkeson et al., 1997a), it is enjoying increasing use in applications such as learning robot dynamics (Moore, 1992; Schaal and Atkeson, 1994) and learning process models. Both classical and Bayesian linear regression analysis tools can be extended to work in the locally weighted framework (Hastie and Tibshirani, 1990), providing confidence intervals on predictions, on gradient estimates and on noise estimates, all of which are important when a learned mapping is to be used by a controller (Atkeson et al., 1997b; Schneider, 1997).

Let us review LWPR. We begin with linear regression on one input and one output. Global linear regression (left of Figure 1) finds the line that minimizes the sum of squared residuals. If this is represented as

$\hat{y}(x) = \beta_0 + \beta_1 x$   (1)

then $\beta_0$ and $\beta_1$ are chosen to minimize

$\sum_{k=1}^{N} \left( y_k - \hat{y}(x_k) \right)^2 = \sum_{k=1}^{N} \left( y_k - \beta_0 - \beta_1 x_k \right)^2.$
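To make the review concrete, the sketch below shows a brute-force locally weighted (degree-1) prediction at a single query point: weight every datapoint by its distance to the query, solve the resulting weighted least-squares problem, and evaluate the local fit at the query. It is a minimal illustration assuming a Gaussian weighting function; the function and parameter names (`lwpr_predict`, `kernel_width`) are illustrative, not the paper's implementation.

```python
import numpy as np

def lwpr_predict(X, y, x_query, kernel_width=0.3):
    """Brute-force locally weighted linear prediction of y at x_query.

    X: (N, d) array of inputs, y: (N,) array of outputs.
    """
    # Weight each datapoint by its squared distance to the query.
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * kernel_width ** 2))

    # Augment inputs with a constant column so the local model has an
    # intercept: yhat(x) = beta_0 + beta_1 x_1 + ... + beta_d x_d.
    Z = np.hstack([np.ones((X.shape[0], 1)), X])

    # Minimize sum_k w_k (y_k - yhat(x_k))^2 via weighted least squares.
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(Z * sw[:, None], y * sw, rcond=None)

    # Evaluate the local fit at the query point.
    return float(np.dot(np.concatenate(([1.0], x_query)), beta))

# Example: noisy sine data, one prediction per query.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
print(lwpr_predict(X, y, np.array([1.5])))  # roughly sin(1.5) = 0.997
```

Note that every prediction touches all N datapoints, which is exactly the per-query cost that the kd-tree search described in the abstract is designed to reduce.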
Similar resources
Multivariate Locally Weighted Polynomial Fitting and Partial Derivative Estimation
The nonparametric regression estimator based on locally weighted least squares fitting has been studied by Fan and by Ruppert and Wand. The latter paper also studies, in the univariate case, nonparametric derivative estimators given by a locally weighted polynomial fitting. Compared with traditional kernel estimators, these estimators are often of simpler form and possess some better properties. In th...
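A minimal sketch, assuming a one-dimensional input, of the general technique referred to in the entry above: fit a locally weighted quadratic around the query point and read the derivative estimate off the coefficient of the linear term. The Gaussian kernel, the bandwidth, and the name `local_quadratic_derivative` are illustrative assumptions, not the estimator studied in the cited paper.

```python
import numpy as np

def local_quadratic_derivative(x, y, x0, bandwidth=0.3):
    """Estimate f'(x0) from noisy samples (x, y) of y = f(x) + noise."""
    # Gaussian weights centred on the query point x0.
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
    # Design matrix in powers of (x - x0): columns [1, t, t^2].
    Z = np.vander(x - x0, N=3, increasing=True)
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(Z * sw[:, None], y * sw, rcond=None)
    # For the centred fit, beta approximates [f(x0), f'(x0), f''(x0)/2].
    return beta[1]

rng = np.random.default_rng(1)
x = rng.uniform(0.0, np.pi, 400)
y = np.sin(x) + 0.05 * rng.standard_normal(400)
print(local_quadratic_derivative(x, y, 1.0))  # roughly cos(1.0) = 0.54
```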
Locally Weighted Polynomial Estimation of Spatial Precipitation
We demonstrate the utility of locally weighted polynomial regression, a nonparametric technique for surface estimation discussed in Lall et al. (1995), for the spatial estimation of a precipitation surface, with data related to the Chernobyl nuclear power plant accident. The method uses multivariate, locally weighted polynomial regression with temperature or precipitation as the dependent variabl...
Locally weighted regression models for surrogate-assisted design optimization
Locally weighted regression combines the advantages of polynomial regression and kernel smoothing. We present three ideas for the appropriate and effective use of LOcally WEighted Scatterplot Smoothing (LOWESS) models for surrogate optimization. First, a method is proposed to reduce the computational cost of LOWESS models. Second, a local scaling coefficient is introduced to adapt LOWESS models to ...